Complete Definition of "Markov chain"

English
Noun
Markov chain

  1. A discrete-time stochastic process with the Markov property.
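
  For illustration only (not part of the original entry): a minimal Python sketch of a
  two-state discrete-time Markov chain, in which the next state is chosen using only the
  current state (the Markov property). The state names and transition probabilities are
  assumed example values.

    import random

    # Assumed example transition probabilities: transition[current][next]
    transition = {
        "sunny": {"sunny": 0.8, "rainy": 0.2},
        "rainy": {"sunny": 0.4, "rainy": 0.6},
    }

    def step(state):
        # The next state depends only on the current state.
        r = random.random()
        cumulative = 0.0
        for nxt, prob in transition[state].items():
            cumulative += prob
            if r < cumulative:
                return nxt
        return nxt  # guard against floating-point rounding

    state = "sunny"
    path = [state]
    for _ in range(10):
        state = step(state)
        path.append(state)
    print(path)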

Translations
Czech: Markovský řetězec m
Estonian: Markovi ahel
Finnish: Markov-ketju, Markovin ketju
Polish: łańcuch Markowa m

See also
Wikipedia article on Markov chains

Category:English eponyms


Revision and Credits for "Markov chain"
Dictionary content provided from Wiktionary.org under the
GNU Free Documentation License
 
 
